
Search results for: "Matei Zaharia"


4 mentions found


A recent research paper revealed a new way to help AI models ingest far more data. Soon, you'll be able to put millions of words into the context windows of AI models, researchers say. Bigger AI models can handle more, but only up to about 75,000 words. This "Ring Attention" method means that we should be able to put millions of words into the context windows of AI models, not just tens of thousands. This chart shows some of the results of tests from the "Ring Attention" AI research paper.
Persons: Matei Zaharia, Pieter Abbeel, Hao Liu Organizations: Google, UC Berkeley, Databricks, Nvidia
GPT-4 users have complained that the OpenAI model is getting 'dumber.' The findings, published on Tuesday, challenge the assumption that AI models automatically improve. One of the bedrock assumptions of the current artificial intelligence boom is that AI models "learn" and improve over time. This is what users of OpenAI's GPT-4, the world's most powerful AI model, have been questioning lately. This recent GPT-4 research paper provides a healthy dose of skepticism about the assumptions that are driving these wild swings in value.
Persons: Matei Zaharia, Gary Marcus Organizations: OpenAI, Twitter, Microsoft
It's not just you: new research suggests ChatGPT's AI model really is getting dumber. There's been a growing feeling for a while now that the AI model behind ChatGPT is, frankly, getting dumber. No one can quite figure out why GPT-4 is changing. What the research doesn't seem to identify is why this performance drop has happened. As the AI model underlying a more advanced version of ChatGPT, one that paying subscribers get access to, that's a bit of a problem for OpenAI. That said, it's hard to ignore the questions of quality surrounding GPT-4 when a whole community of AI devotees is asking them.
Persons: Ethan Mollick, Peter Yang, Alistair Barr, Peter Welinder, Matei Zaharia, Arvind Narayanan Organizations: OpenAI, Stanford University, UC Berkeley, Wharton Locations: Princeton
The most important groundwork for building company culture was a strong founding team, Ghodsi says. Ghodsi arrived at UC Berkeley in 2009 for a year-long program to research machine learning and data processing. While working at European universities, Ghodsi says, he was often shut down when proposing out-of-the-box research ideas, but "UC Berkeley was different." Ghodsi went on to cofound Databricks out of a UC Berkeley research lab in 2013. Databricks' founding team was extremely innovative, Ghodsi says, with backgrounds in research and in creating the open-source data project Spark.
Total: 4